Strain-minimizing hyperbolic network embeddings with landmarks
Abstract
We introduce L-hydra (landmarked hyperbolic distance recovery and approximation), a method for embedding network- or distance-based data into hyperbolic space, which requires only the distance measurements to a few ‘landmark nodes’. This landmark heuristic makes L-hydra applicable to large-scale graphs and improves upon previously introduced methods. As mathematical justification, we show that a point configuration in $d$-dimensional hyperbolic space can be perfectly recovered (up to isometry) from distance measurements to just $d+1$ landmarks. We also show that L-hydra solves a two-stage strain-minimization problem, similar to our previous (unlandmarked) method ‘hydra’. Testing on real network data, we find that L-hydra is an order of magnitude faster than existing hyperbolic embedding methods and scales linearly in the number of nodes. While the embedding error of L-hydra is higher than that of existing methods, our extension, L-hydra+, outperforms existing methods in both runtime and embedding quality.
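As a rough illustration of the recovery result above (a sketch, not the authors' L-hydra implementation), the snippet below places a single node in the hyperboloid model of $d$-dimensional hyperbolic space from its distances to already-embedded landmarks. It relies on the hyperboloid-model identity $\cosh d(x,y) = -\langle x, y\rangle_M$, where $\langle\cdot,\cdot\rangle_M$ is the Minkowski bilinear form, which turns placement into a linear least-squares problem in the node's $d+1$ ambient coordinates; with $d+1$ landmarks in general position the solution is unique, consistent with the $d+1$-landmark recovery statement. The function name `place_node` and the final re-projection onto the hyperboloid are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def place_node(landmarks, dists):
    """Place one node in the hyperboloid model of H^d (illustrative sketch).

    landmarks : (k, d+1) array, rows on the hyperboloid <x, x>_M = -1, x0 > 0
    dists     : (k,) array of hyperbolic distances from the node to each landmark
    Uses cosh d(x, l_i) = -<x, l_i>_M, which is linear in the node's coordinates.
    """
    k, dim = landmarks.shape                   # dim = d + 1 ambient coordinates
    J = np.diag([-1.0] + [1.0] * (dim - 1))    # Minkowski form, signature (-,+,...,+)
    A = landmarks @ J                          # row i is l_i^T J
    b = -np.cosh(dists)                        # <x, l_i>_M = -cosh(d_i)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)  # exact if k = d+1 and distances are consistent
    # Re-project onto the hyperboloid (only matters when distances are noisy).
    s = -(x @ J @ x)
    if s > 0:
        x = x / np.sqrt(s)
    return x if x[0] > 0 else -x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 3

    def random_points(n):                      # random points on the hyperboloid
        z = rng.normal(size=(n, d))
        x0 = np.sqrt(1.0 + np.sum(z**2, axis=1, keepdims=True))
        return np.hstack([x0, z])

    J = np.diag([-1.0] + [1.0] * d)
    landmarks = random_points(d + 1)           # exactly d+1 landmarks suffice
    x_true = random_points(1)[0]
    dists = np.arccosh(-(landmarks @ J @ x_true))
    x_hat = place_node(landmarks, dists)
    print(np.max(np.abs(x_hat - x_true)))      # ~1e-12: recovery up to numerical error
```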
Similar resources
Representation Tradeoffs for Hyperbolic Embeddings
Hyperbolic embeddings offer excellent quality with few dimensions when embedding hierarchical data structures like synonym or type hierarchies. Given a tree, we give a combinatorial construction that embeds the tree in hyperbolic space with arbitrarily low distortion without using optimization. On WordNet, our combinatorial embedding obtains a mean-average-precision of 0.989 with only two dimen...
Groups with no coarse embeddings into hyperbolic groups
We introduce an obstruction to the existence of a coarse embedding of a given group or space into a hyperbolic group, or more generally into a hyperbolic graph of bounded degree. The condition we consider is “admitting exponentially many fat bigons”, and it is preserved by a coarse embedding between graphs with bounded degree. Groups with exponential growth and linear divergence (such as direct...
Neural Embeddings of Graphs in Hyperbolic Space
ABSTRACT Neural embeddings have been used with great success in Natural Language Processing (NLP). They provide compact representations that encapsulate word similarity and attain state-of-the-art performance in a range of linguistic tasks. The success of neural embeddings has prompted significant amounts of research into applications in domains other than language. One such domain is graph-stru...
Equivariant Embeddings of Trees into Hyperbolic Spaces
For every cardinal α ≥ 2 there are three complete constant curvature model manifolds of Hilbert dimension α: the sphere $S^\alpha$, the Euclidean space $E^\alpha$ and the hyperbolic space $H^\alpha$. Studying isometric actions on these spaces corresponds in the first case to studying orthogonal representations and in the second case to studying cohomology in degree one with orthogonal representations as coefficients. In ...
Journal
Journal title: Journal of Complex Networks
Year: 2022
ISSN: 2051-1310, 2051-1329
DOI: https://doi.org/10.1093/comnet/cnad002